# Multi-Turn Dialogue Optimization
Llama 3.3 Nemotron Super 49B V1 GGUF
Other
Llama-3.3-Nemotron-Super-49B-v1 is a large language model derived from Meta's Llama-3.3-70B-Instruct, with improved reasoning, alignment with human chat preferences, and task-execution ability, and it supports a 128K-token context length.
Large Language Model · Transformers · English
unsloth · 814 · 1
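Because this entry ships GGUF quantizations, a minimal loading sketch with llama-cpp-python follows; the repo id, quant filename, and the reasoning-mode system prompt are assumptions, so check the actual file listing and model card before use.

```python
from llama_cpp import Llama

# Repo id and quant filename are assumptions; pick them from the actual GGUF listing.
llm = Llama.from_pretrained(
    repo_id="unsloth/Llama-3_3-Nemotron-Super-49B-v1-GGUF",  # hypothetical repo id
    filename="*Q4_K_M.gguf",   # any quant that fits your RAM/VRAM
    n_ctx=8192,                # the model supports up to 128K tokens; a smaller window saves memory
    n_gpu_layers=-1,           # offload all layers to GPU if one is available
)

messages = [
    # Nemotron reportedly toggles reasoning via the system prompt; treat this as an assumption.
    {"role": "system", "content": "detailed thinking off"},
    {"role": "user", "content": "Summarize the benefits of a 128K context window."},
]
out = llm.create_chat_completion(messages=messages, max_tokens=256)
print(out["choices"][0]["message"]["content"])
```
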
Oumuamua 7b Instruct V2
Apache-2.0
Oumuamua-7b-instruct-v2 is a Japanese and English text-generation model built by merging multiple pre-trained language models, with particular improvements in role-play and multi-turn dialogue.
Large Language Model · Transformers · Multilingual
nitky · 39 · 24
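Given the section's focus on multi-turn dialogue, here is a minimal sketch of replaying a conversation history through the Transformers chat template; the repo id and the example turns are assumptions rather than content from the model card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nitky/Oumuamua-7b-instruct-v2"  # assumed repo id based on the listing

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Multi-turn history: earlier turns are replayed so the model keeps the conversation state.
messages = [
    {"role": "user", "content": "Please introduce yourself in one sentence."},
    {"role": "assistant", "content": "I am a bilingual Japanese/English assistant."},
    {"role": "user", "content": "Now continue the conversation in Japanese."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```
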
Instella 3B
Other
Instella is AMD's fully open family of 3-billion-parameter language models, trained on Instinct MI300X GPUs and outperforming open models of comparable size.
Large Language Model · Transformers
amd · 3,048 · 34
Llama 3 Soliloquy 8B V2
A high-performance role-playing model designed for immersive, dynamic experiences, supporting a 24K-token context length.
Large Language Model · Transformers · English
elyn-dev · 95 · 56
Meta Llama 3 70B Fp8
Other
Meta Llama 3 70B is a large language model developed by Meta with 70 billion parameters and an 8K-token context length, intended for English-language commercial and research use.
Large Language Model · Transformers · English
FriendliAI · 34 · 5
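For the FP8 checkpoint, a minimal serving sketch with vLLM is shown below, assuming the weights are published in a vLLM-readable FP8 format; the repo id, GPU count, and context limit are placeholders to adjust against the actual listing.

```python
from vllm import LLM, SamplingParams

# Repo id is illustrative; substitute the actual FP8 checkpoint from the listing.
# vLLM can usually detect a pre-quantized FP8 checkpoint from its config; passing
# quantization="fp8" instead quantizes a BF16 checkpoint on the fly.
llm = LLM(
    model="FriendliAI/Meta-Llama-3-70B-Instruct-fp8",  # hypothetical repo id
    tensor_parallel_size=4,   # a 70B model in FP8 still needs multiple GPUs
    max_model_len=8192,       # matches the 8K context noted above
)

params = SamplingParams(temperature=0.7, max_tokens=200)
outputs = llm.generate(["Explain FP8 quantization in two sentences."], params)
print(outputs[0].outputs[0].text)
```
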
Telechat 7B
Apache-2.0
TeleChat is a large language model developed and trained by China Telecom AI Technology Co., Ltd. The 7B base model is trained on 1.5 trillion tokens of high-quality Chinese and English corpus, and the 12B base model on 3 trillion tokens.
Large Language Model · Transformers
Tele-AI · 238 · 108
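A minimal text-generation sketch for the bilingual TeleChat checkpoint is shown below, assuming the Hub repo is Tele-AI/telechat-7B and that it ships custom modeling code (hence trust_remote_code); verify both against the actual model card.

```python
from transformers import pipeline

# Assumed repo id; the checkpoint is presumed to ship custom modeling code,
# which is why trust_remote_code=True is set here.
generator = pipeline(
    "text-generation",
    model="Tele-AI/telechat-7B",
    trust_remote_code=True,
    device_map="auto",
    torch_dtype="auto",
)

prompt = "Briefly explain the advantages of pre-training on both Chinese and English corpora."
result = generator(prompt, max_new_tokens=200, do_sample=True, temperature=0.7)
print(result[0]["generated_text"])
```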